Nonlinear Dimensionality Reduction for Regression

Authors

  • Minyoung Kim
  • Vladimir Pavlovic
Abstract

The task of dimensionality reduction for regression (DRR) is to find a low-dimensional representation z ∈ R^q of the input covariates x ∈ R^p, with q ≪ p, for regressing the output y ∈ R^d. DRR is beneficial for visualizing high-dimensional data and for designing efficient regressors on a reduced input dimension, but also for eliminating noise in the data x by uncovering the essential information z needed to predict y. However, while dimensionality reduction methods are common in many machine learning tasks (discriminant analysis, graph embedding, metric learning, principal subspace methods), their use in regression settings has not been widespread.
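To make the DRR setting concrete, here is a minimal two-step sketch: reduce x to z, then regress y on z. It uses plain PCA followed by least squares on synthetic toy data, which is only a linear stand-in for illustration, not the nonlinear method proposed in this paper; all variable names and data are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the p = 10 covariates live near a q = 2 dimensional subspace,
# and the response y depends only on those q hidden coordinates.
n, p, q = 500, 10, 2
T = rng.normal(size=(n, q))                    # hidden low-dimensional factors
A = rng.normal(size=(p, q))
X = T @ A.T + 0.05 * rng.normal(size=(n, p))   # high-dimensional covariates x
y = T[:, 0] + 0.5 * T[:, 1] ** 2 + 0.1 * rng.normal(size=n)

# Step 1: reduce x to z (here via PCA: keep the top-q principal directions).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:q].T                              # low-dimensional representation z

# Step 2: regress y on the reduced input z (least squares with a bias term).
Zb = np.column_stack([np.ones(n), Z])
w, *_ = np.linalg.lstsq(Zb, y, rcond=None)
resid = y - Zb @ w
print("reduced dimension:", Z.shape[1])
```

Because PCA ignores y, it can pick directions irrelevant to the response; supervised DRR methods like the one in this paper choose z with the regression task in mind.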


Related articles

Using Manifold Learning for Nonlinear System Identification

A high-dimensional regression space usually causes problems in nonlinear system identification. However, if the regression data are contained in (or spread tightly around) some manifold, the dimensionality can be reduced. This paper presents a use of dimension reduction techniques to compose a two-step identification scheme suitable for high-dimensional identification problems with manifold-val...


Visualization of Regression Models Using Discriminative Dimensionality Reduction

Although regression models are a standard tool in machine learning, there are barely any ways to inspect a trained model beyond plotting the prediction against single features. In this contribution, we propose a general framework to visualize a trained regression model together with the training data in two dimensions. For this purpose, we rely on modern nonlinear dimensionali...


Localized regression on principal manifolds

We consider nonparametric dimension reduction techniques for multivariate regression problems in which the variables constituting the predictor space are strongly nonlinearly related. Specifically, the predictor space is approximated via “local” principal manifolds, based on which a kernel regression is carried out.
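The two-stage idea in this abstract (approximate the predictor space, then run a kernel regression on the resulting coordinates) can be sketched as follows. As a simplification, a single global PCA coordinate stands in for the paper's local principal manifolds, and a Nadaraya-Watson estimator stands in for its kernel regression; the data and bandwidth are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Predictors lie near a 1-D curve (a noisy arc) embedded in R^3.
n = 300
t = rng.uniform(0, np.pi, size=n)
X = np.column_stack([np.cos(t), np.sin(t), 0.1 * rng.normal(size=n)])
y = np.sin(2 * t) + 0.05 * rng.normal(size=n)

def nw_kernel_regression(Z_train, y_train, Z_query, h=0.1):
    """Nadaraya-Watson estimate with a Gaussian kernel of bandwidth h."""
    d2 = ((Z_query[:, None, :] - Z_train[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * h ** 2))
    return (W @ y_train) / W.sum(axis=1)

# Stand-in for the manifold step: a 1-D PCA coordinate on the predictors.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:1].T

# Kernel regression carried out on the reduced coordinate.
y_hat = nw_kernel_regression(Z, y, Z, h=0.1)
```

Fitting on a 1-D manifold coordinate instead of the raw 3-D predictors is what lets the nonparametric smoother avoid the curse of dimensionality here.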


Dimensionality reduction based on non-parametric mutual information

In this paper we introduce a supervised linear dimensionality reduction algorithm which finds a projected input space that maximizes the mutual information between input and output values. The algorithm utilizes the recently introduced MeanNN estimator for differential entropy. We show that the estimator is an appropriate tool for the dimensionality reduction task. Next we provide a nonlinear r...


Kernel logistic PLS: A tool for supervised nonlinear dimensionality reduction and binary classification

"Kernel logistic PLS" (KL-PLS) is a new tool for supervised nonlinear dimensionality reduction and binary classification. The principles of KL-PLS are based on both the construction of PLS latent variables and learning with kernels. The KL-PLS algorithm can be seen as a supervised dimensionality reduction (complexity control step) followed by a classification based on logistic regression. The algorithm...
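The two stages described here, PLS latent-variable construction followed by logistic regression, can be sketched with a linear (non-kernel) simplification; this is not the KL-PLS algorithm itself, and the toy data and step sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy binary classification data in p = 20 dimensions, 3 informative features.
n, p = 400, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:3] = [2.0, -1.5, 1.0]
y = (X @ beta + 0.5 * rng.normal(size=n) > 0).astype(float)

def pls_scores(X, y, n_comp=2):
    """PLS1-style scores: each direction maximizes covariance between
    the X-scores and the centered labels, with deflation in between."""
    Xk = X - X.mean(axis=0)
    yc = y - y.mean()
    T = []
    for _ in range(n_comp):
        w = Xk.T @ yc
        w /= np.linalg.norm(w)
        t = Xk @ w
        T.append(t)
        Xk = Xk - np.outer(t, t @ Xk) / (t @ t)   # deflate X
    return np.column_stack(T)

# Stage 1: supervised dimensionality reduction (complexity control).
T = pls_scores(X, y, n_comp=2)

# Stage 2: logistic regression on the latent scores, fit by gradient descent.
Z = np.column_stack([np.ones(n), T])
w = np.zeros(Z.shape[1])
for _ in range(2000):
    grad = Z.T @ (1 / (1 + np.exp(-Z @ w)) - y) / n
    w -= 0.5 * grad
acc = np.mean((1 / (1 + np.exp(-Z @ w)) > 0.5) == y)
```

Replacing the linear scores with kernel-induced feature-space scores is what would turn this sketch into a nonlinear method in the spirit of KL-PLS.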



Publication date: 2008